
    SARDSRN: A Neural Network Shift-Reduce Parser

    Simple Recurrent Networks (SRNs) have been widely used in natural language tasks. SARDSRN extends the SRN by explicitly representing the input sequence in a SARDNET self-organizing map. The distributed SRN component leads to good generalization and robust cognitive properties, whereas the SARDNET map provides exact representations of the sentence constituents. This combination allows SARDSRN to learn to parse sentences with more complicated structure than the SRN alone can, and suggests that the approach could scale up to realistic natural language. A sketch of the SARDNET-style sequence encoding follows the abstract.
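    The abstract describes pairing an SRN's distributed hidden representation with a SARDNET map that keeps an exact, unit-per-constituent trace of the input sequence. Below is a minimal sketch of that SARDNET encoding step, assuming a pre-organized 1-D map, Euclidean winner selection, a once-per-unit winner rule, and a decay schedule; the function name, array shapes, and parameter values are illustrative, not the paper's exact formulation.

    ```python
    import numpy as np

    def sardnet_encode(inputs, weights, decay=0.9):
        """Encode a sequence on a SARDNET-style map.

        inputs  : (T, d) array of word representations
        weights : (n_units, d) map weight vectors (assumed already trained)
        returns : (n_units,) activation pattern representing the whole sequence
        """
        activations = np.zeros(len(weights))
        available = np.ones(len(weights), dtype=bool)   # units not yet claimed
        for x in inputs:
            activations *= decay                        # earlier items fade
            dists = np.linalg.norm(weights - x, axis=1)
            dists[~available] = np.inf                  # each unit wins at most once
            winner = int(np.argmin(dists))
            activations[winner] = 1.0                   # exact slot for this constituent
            available[winner] = False
        return activations
    ```

    The resulting activation vector could then be fed to the SRN-based parser alongside its recurrent hidden state, giving it both a distributed summary and an exact record of the constituents seen so far.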

    Tilt Aftereffects in a Self-Organizing Model of the Primary Visual Cortex

    RF-LISSOM, a self-organizing model of laterally connected orientation maps in the primary visual cortex, was used to study the psychological phenomenon known as the tilt aftereffect. The same self-organizing processes that are responsible for the long-term development of the map are shown to result in tilt aftereffects over short time scales in the adult. The model permits simultaneous observation of large numbers of neurons and connections, making it possible to relate high-level phenomena to low-level events, which is difficult to do experimentally. The results give detailed computational support for the long-standing conjecture that the direct tilt aftereffect arises from adaptive lateral interactions between feature detectors. They also make a new prediction that the indirect effect results from the normalization of synaptic efficacies during this process. The model thus provides a unified computational explanation of self-organization and both the direct and indirect tilt aftereffect in the primary visual cortex. A sketch of the adaptation-plus-normalization step follows the abstract.
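    The mechanism the abstract points to is Hebbian adaptation of lateral connections between co-active orientation detectors during adaptation, followed by normalization of each unit's total synaptic strength. This is a rough sketch of that step, not RF-LISSOM's actual implementation; the learning rate and the normalization scheme are placeholders chosen only to show the two operations the abstract names.

    ```python
    import numpy as np

    def adapt_lateral_weights(L, activity, alpha=0.01):
        """One adaptation step on lateral weights L given current map activity.

        L        : (n, n) lateral connection strengths between map units
        activity : (n,) activity pattern evoked by the adapting stimulus
        """
        L = L + alpha * np.outer(activity, activity)   # Hebbian: co-active units strengthen
        L = L / L.sum(axis=1, keepdims=True)           # normalize each unit's total efficacy
        return L
    ```

    In this reading, the Hebbian term accounts for the direct aftereffect (stronger inhibition near the adapting orientation), while the normalization step, which redistributes synaptic strength away from distant detectors, is the candidate source of the indirect effect.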

    Hebbian Learning and Temporary Storage in the Convergence-Zone Model of Episodic Memory

    The Convergence-Zone model shows how sparse, random memory patterns can lead to one-shot storage and high capacity in the hippocampal component of the episodic memory system. This paper presents a biologically more realistic version of the model, with continuously-weighted connections and storage through Hebbian learning and normalization. In contrast to the gradual weight adaptation in many neural network models, episodic memory turns out to require high learning rates. Normalization allows earlier patterns to be overwritten, introducing time-dependent forgetting similar to that observed in the hippocampus. A sketch of the storage rule follows the abstract.
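    The abstract's two technical points are that episodic storage needs large, one-shot Hebbian weight changes and that normalization causes older traces to fade. The sketch below illustrates both under simplifying assumptions (binary feature and binding-layer patterns, a simple sum-based normalization); all names and values are illustrative rather than taken from the paper.

    ```python
    import numpy as np

    def store_pattern(W, feature, binding, lr=1.0):
        """One-shot Hebbian storage of a feature<->binding-layer association.

        W       : (n_binding, n_feature) connection weights
        feature : (n_feature,) sparse binary feature-map pattern
        binding : (n_binding,) sparse binary binding (convergence-zone) pattern
        """
        W = W + lr * np.outer(binding, feature)   # high learning rate: one-shot storage
        norms = W.sum(axis=1, keepdims=True)
        norms[norms == 0] = 1.0                   # avoid dividing unused rows by zero
        W = W / norms                             # normalization: older associations
        return W                                  # lose relative strength over time
    ```

    Repeatedly calling this on new patterns shows the time-dependent forgetting the abstract describes: each normalization step shrinks the share of weight held by earlier associations.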